Conversation

@ItsOnlyBinary ItsOnlyBinary commented Sep 4, 2025

Related GitHub Issue

Closes: #7674

Roo Code Task Context (Optional)

Description

The task view was not showing Ollama's max context length while chatting.
I looked at how LM Studio had been changed to retrieve its model info within the UI.
I used that as a template to change how Ollama works, so the info is now shown.

Test Procedure

Start chatting with Ollama
In the Task Header you will see token usage as: used/8k
On the current version it shows: used/1
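The used/8k vs. used/1 difference comes down to whether the UI has a real context window to format. A minimal sketch of the kind of formatter that turns 8192 into "8k" (hypothetical; the actual Task Header component may format differently):

```typescript
// Hypothetical helper: render a context window compactly for the Task Header.
// Not the actual Roo Code implementation, just an illustration of the display.
function formatContextWindow(tokens: number): string {
	return tokens >= 1000 ? `${Math.round(tokens / 1000)}k` : String(tokens)
}

formatContextWindow(8192) // "8k" — the fixed display
formatContextWindow(1) // "1" — the broken display, when no model info was available
```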

Pre-Submission Checklist

  • Issue Linked: This PR is linked to an approved GitHub Issue (see "Related GitHub Issue" above).
  • Scope: My changes are focused on the linked issue (one major feature/fix per PR).
  • Self-Review: I have performed a thorough self-review of my code.
  • Testing: New and/or updated tests have been added to cover my changes (if applicable).
  • Documentation Impact: I have considered if my changes require documentation updates (see "Documentation Updates" section below).
  • Contribution Guidelines: I have read and agree to the Contributor Guidelines.

Screenshots / Videos

before:
(screenshot)

after:
(screenshot)

Documentation Updates

Additional Notes

Get in Touch


Important

Update Ollama model handling to fetch and display detailed model information, aligning with LM Studio's approach.

  • Behavior:
    • Update webviewMessageHandler to fetch Ollama models with full details instead of just IDs.
    • Modify requestOllamaModels to flush cache and fetch fresh models.
    • Display Ollama model details in the UI, similar to LM Studio.
  • UI Components:
    • Update Ollama.tsx to handle and display detailed Ollama models.
    • Add useOllamaModels hook to fetch and manage Ollama models.
    • Modify useSelectedModel to include Ollama models in selection logic.
  • Types:
    • Change ollamaModels type in ExtensionMessage.ts from string[] to ModelRecord.
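The behavior and type changes above can be sketched together. Identifiers such as `flushModels`, `getOllamaModels`, and the shape of `ModelRecord` are assumptions for illustration, not the actual Roo Code API:

```typescript
// Assumed shapes: a ModelRecord maps model IDs to details, rather than the old string[] of IDs.
interface ModelInfo {
	contextWindow: number
}
type ModelRecord = Record<string, ModelInfo>

const cache = new Map<string, ModelRecord>()

function flushModels(provider: string): void {
	cache.delete(provider)
}

async function getOllamaModels(baseUrl: string): Promise<ModelRecord> {
	const cached = cache.get("ollama")
	if (cached) return cached
	// In the real handler this would query the Ollama API at baseUrl for each
	// model's details; a stub stands in here so the sketch is self-contained.
	const models: ModelRecord = { "llama3:8b": { contextWindow: 8192 } }
	cache.set("ollama", models)
	return models
}

// requestOllamaModels: flush first so the webview always receives fresh details.
async function requestOllamaModels(baseUrl: string): Promise<ModelRecord> {
	flushModels("ollama")
	return getOllamaModels(baseUrl)
}
```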

This description was created by Ellipsis for 68df30d.

@ItsOnlyBinary ItsOnlyBinary requested a review from jr as a code owner September 4, 2025 18:41
@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. UI/UX UI/UX related or focused labels Sep 4, 2025
@roomote roomote bot left a comment


Thank you for your contribution! I've reviewed the changes and this PR successfully fixes the Ollama max context window display issue. The implementation follows the same pattern as LM Studio, which provides good consistency.

✅ Positive Aspects

  • Correctly addresses the root cause where info was always undefined
  • Good consistency with LM Studio's implementation pattern
  • Comprehensive test coverage added for the new functionality
  • Type changes are properly propagated through the codebase

💭 Minor Suggestions for Improvement

  1. Cache flushing in requestOllamaModels handler (webviewMessageHandler.ts line ~662)

    • The cache flush ensures fresh models, which is good for accuracy
    • However, if this handler is called frequently (e.g., on every settings page load), it might impact performance
    • Consider whether the cache flush is always necessary, or if it could be conditional
  2. Error message clarity (useOllamaModels.ts line ~16)

    • The timeout error message could be more helpful
    • Consider: "Ollama models request timed out. Please check if Ollama is running and accessible at the configured URL."
  3. Conditional fetching logic (useOllamaModels.ts line ~39)

    • The hook only fetches models when modelId is provided
    • This differs from the LM Studio implementation
    • While this seems intentional, a comment explaining the reasoning would help future maintainers
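Suggestions 2 and 3 can be sketched in one helper. This assumes a message-based fetch and is illustrative only; `fetchOllamaModels` and its signature are not the actual `useOllamaModels` hook:

```typescript
const OLLAMA_TIMEOUT_MS = 10_000

type OllamaModels = Record<string, { contextWindow: number }>

async function fetchOllamaModels(
	request: () => Promise<OllamaModels>,
	modelId?: string,
): Promise<OllamaModels | undefined> {
	// Suggestion 3: skip fetching when no model is selected. A comment like this
	// in the hook would record the intent for future maintainers.
	if (!modelId) return undefined

	// Suggestion 2: a descriptive timeout message beats a bare "timed out".
	let timer: ReturnType<typeof setTimeout> | undefined
	const timeout = new Promise<never>((_, reject) => {
		timer = setTimeout(
			() =>
				reject(
					new Error(
						"Ollama models request timed out. Please check if Ollama is running and accessible at the configured URL.",
					),
				),
			OLLAMA_TIMEOUT_MS,
		)
	})
	try {
		return await Promise.race([request(), timeout])
	} finally {
		clearTimeout(timer) // avoid a dangling timer once the request settles
	}
}
```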

Overall Assessment

The implementation is solid and fixes the reported issue effectively. The code quality is good and follows existing patterns. The minor suggestions above are optional improvements that don't block the PR.

@hannesrudolph hannesrudolph added the Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. label Sep 4, 2025
@daniel-lxs daniel-lxs moved this from Triage to PR [Needs Prelim Review] in Roo Code Roadmap Sep 5, 2025
@hannesrudolph hannesrudolph added PR - Needs Preliminary Review and removed Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. labels Sep 5, 2025

@daniel-lxs daniel-lxs left a comment


Thank you @ItsOnlyBinary!

@daniel-lxs daniel-lxs moved this from PR [Needs Prelim Review] to PR [Needs Review] in Roo Code Roadmap Sep 8, 2025
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Sep 8, 2025
@mrubens mrubens merged commit 76c6745 into RooCodeInc:main Sep 9, 2025
29 checks passed
@github-project-automation github-project-automation bot moved this from PR [Needs Review] to Done in Roo Code Roadmap Sep 9, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Sep 9, 2025


Development

Successfully merging this pull request may close these issues.

Task display of max context window broken with ollama

4 participants